Information Theoretic Interpretations for H∞ Entropy

Authors

  • Hui ZHANG
  • Youxian SUN
Abstract

Based on a study of information transmission in discrete multivariable linear time-invariant (LTI) systems disturbed by stationary noise, the relations among entropy rate, mutual information rate and H∞ entropy are discussed for both the general control problem and the classic tracking problem. For general control systems, equivalent relations between entropy rate and H∞ entropy are formulated using spectral factorization. For classic tracking systems, the performance of disturbance rejection is measured by a difference of entropy rates (or by a mutual information rate in the Gaussian case). This performance function is proved to be bounded by the H∞ entropy of the closed-loop transfer function from disturbance to output. These relations give information theoretic interpretations for minimum entropy H∞ control theory. Potential applications of these relations are discussed. Copyright © 2005 IFAC
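
For reference, a minimal sketch of the quantities the abstract relates, assuming the standard discrete-time definitions from the minimum entropy H∞ literature (Mustafa–Glover style); the paper's exact normalization and notation may differ. The H∞ entropy of a stable closed-loop transfer matrix T(z) at a level \gamma > \|T\|_\infty is

E(T;\gamma) = -\frac{\gamma^{2}}{2\pi} \int_{-\pi}^{\pi} \ln \left| \det\!\left( I - \gamma^{-2}\, T(e^{j\omega})^{*}\, T(e^{j\omega}) \right) \right| \, d\omega ,

while the differential entropy rate of an m-dimensional stationary Gaussian process x with power spectral density S_x(\omega) is given by the Kolmogorov–Szego formula

\bar{h}(x) = \tfrac{m}{2}\ln(2\pi e) + \frac{1}{4\pi} \int_{-\pi}^{\pi} \ln \det S_x(\omega) \, d\omega .

In the scalar Gaussian case, the mutual information rate between jointly stationary processes x and y can be written via the spectral coherence C_{xy}(\omega) as \bar{I}(x;y) = -\frac{1}{4\pi} \int_{-\pi}^{\pi} \ln\!\left( 1 - |C_{xy}(\omega)|^{2} \right) d\omega .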

Similar articles

Nonlinear Stochastic Control and Information Theoretic Dualities: Connections, Interdependencies and Thermodynamic Interpretations

In this paper, we present connections between recent developments in the linearly-solvable stochastic optimal control framework and early work in control theory based on the fundamental dualities between free energy and relative entropy. We extend these connections to nonlinear stochastic systems with non-affine controls by using the generalized version of the Feynman–Kac lemma. We present alt...

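A worked statement of the duality referred to above, in the form commonly used in the linearly-solvable / path-integral control literature; the notation (cost J, temperature \lambda, base measure P) is generic and only illustrative:

-\lambda \ln \mathbb{E}_{P}\!\left[ e^{-J/\lambda} \right] = \min_{Q \ll P} \left\{ \mathbb{E}_{Q}[J] + \lambda\, D_{\mathrm{KL}}(Q \,\|\, P) \right\},

with minimizing measure dQ^{*} \propto e^{-J/\lambda}\, dP. The left-hand side is the free energy of the cost; the right-hand side is the relative-entropy-regularized control objective.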

The smooth entropy formalism for von Neumann algebras

We discuss information-theoretic concepts on infinite-dimensional quantum systems. In particular, we lift the smooth entropy formalism as introduced by Renner and collaborators for finite-dimensional systems to von Neumann algebras. For the smooth conditional min- and max-entropy we recover similar characterizing properties and information-theoretic operational interpretations as in the finite-di...

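For orientation, a sketch of the finite-dimensional definitions that this abstract describes lifting, following Renner's formulation as stated in the standard literature (not taken from this paper's text): for a bipartite state \rho_{AB},

H_{\min}(A|B)_{\rho} = -\log \min \left\{ \lambda > 0 \;:\; \rho_{AB} \le \lambda\, \mathrm{id}_A \otimes \sigma_B \ \text{for some state } \sigma_B \right\},

and the smooth min-entropy is obtained by optimizing over states \tilde{\rho} that are \varepsilon-close to \rho (e.g. in purified distance):

H_{\min}^{\varepsilon}(A|B)_{\rho} = \max_{\tilde{\rho}\,\approx_{\varepsilon}\,\rho} H_{\min}(A|B)_{\tilde{\rho}} .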

Jeffreys' prior is asymptotically least favorable under entropy risk

We provide a rigorous proof that Jeffreys' prior asymptotically maximizes Shannon's mutual information between a sample of size n and the parameter. This was conjectured by Bernardo (1979) and, despite the absence of a proof, forms the basis of the reference prior method in Bayesian statistical analysis. Our proof rests on an examination of large sample decision theoretic properties associated ...

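As a pointer to the objects involved (standard definitions, not taken from this paper): Jeffreys' prior is \pi_{J}(\theta) \propto \sqrt{\det I(\theta)}, where I(\theta) is the Fisher information matrix

I(\theta)_{ij} = \mathbb{E}_{\theta}\!\left[ \frac{\partial \ln p(X\mid\theta)}{\partial \theta_i}\, \frac{\partial \ln p(X\mid\theta)}{\partial \theta_j} \right],

and the reference-prior criterion asks for the prior maximizing the mutual information I(\Theta; X^{n}) between the parameter and a sample of size n, in the large-n limit.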

Information theoretic measures of dependence, compactness, and non-Gaussianity for multivariate probability distributions

A basic task of exploratory data analysis is the characterisation of “structure” in multivariate datasets. For bivariate Gaussian distributions, natural measures of dependence (the predictive relationship between individual variables) and compactness (the degree of concentration of the probability density function (pdf) around a low-dimensional axis) are respectively provided by ordinary least-...

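A concrete instance of the Gaussian measures discussed above (standard identities, stated here only for illustration): for a bivariate Gaussian with correlation coefficient \rho, the mutual information is

I(X;Y) = -\tfrac{1}{2} \ln\!\left( 1 - \rho^{2} \right),

and for an n-variate Gaussian with correlation matrix R, the total correlation (multi-information) is

I(X_1; \ldots; X_n) = -\tfrac{1}{2} \ln \det R .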

A Game Theoretic Approach to Quantum Information

In this project, bridging entropy econometrics, game theory and information theory, a game theoretic approach to quantum information will be investigated, in which new mathematical definitions for quantum relative entropy, quantum mutual information, and quantum channel capacity will be given, and monotonicity of entangled quantum relative entropy and additivity of quantum channel capacity wi...

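For completeness, the standard quantum definitions against which the project's proposed game-theoretic versions would be compared (not the project's new definitions): the quantum relative entropy and quantum mutual information of a bipartite state \rho_{AB} are

S(\rho \,\|\, \sigma) = \mathrm{Tr}\!\left[ \rho \left( \log \rho - \log \sigma \right) \right], \qquad I(A;B)_{\rho} = S\!\left( \rho_{AB} \,\|\, \rho_A \otimes \rho_B \right),

and monotonicity means S(\Phi(\rho) \,\|\, \Phi(\sigma)) \le S(\rho \,\|\, \sigma) for every completely positive trace-preserving map \Phi.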


Journal title:

Volume   Issue

Pages  -

Publication date: 2005